Dataset Distillation



Diversity-Driven Synthesis: Enhancing Dataset Distillation through Directed Weight Adjustment

Neural Information Processing Systems

To avoid redundancy in these synthetic datasets, it is crucial that each element contains unique features and remains distinct from the others during the synthesis stage. In this paper, we provide a thorough theoretical and empirical analysis of diversity within synthesized datasets. We argue that enhancing diversity can improve the parallelizable yet isolated synthesis approach.
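The abstract does not specify how diversity is measured, but one common proxy (an illustrative assumption, not the paper's actual objective) is the mean pairwise cosine similarity among flattened synthetic samples; adding this term to the distillation loss with a small weight discourages redundant samples:

```python
import math

def pairwise_cosine_diversity_penalty(samples):
    """Mean pairwise cosine similarity among flattened synthetic samples.

    Lower values indicate more diverse samples, so minimizing this term
    alongside the main distillation loss discourages redundancy.
    This is a generic sketch, not the method proposed in the paper.
    """
    def cos(a, b):
        dot = sum(x * y for x, y in zip(a, b))
        na = math.sqrt(sum(x * x for x in a))
        nb = math.sqrt(sum(y * y for y in b))
        return dot / (na * nb)

    n = len(samples)
    sims = [cos(samples[i], samples[j])
            for i in range(n) for j in range(i + 1, n)]
    return sum(sims) / len(sims)
```

For example, two identical samples yield a penalty of 1.0, while orthogonal samples yield 0.0.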





Sequential Subset Matching for Dataset Distillation

Neural Information Processing Systems

The synthetic datasets are expected to capture the essence of the knowledge contained in real-world datasets such that the former yields a similar performance as the latter.



MGDD: A Meta Generator for Fast Dataset Distillation

Neural Information Processing Systems

The meta generator is termed MGDD in our approach. Once adapted, it can handle synthetic datasets of arbitrary size, even sizes unseen during adaptation.



Color-Oriented Redundancy Reduction in Dataset Distillation

Neural Information Processing Systems

In this paper, we propose AutoPalette, a framework that minimizes color redundancy at both the individual-image and whole-dataset levels. At the image level, we employ a palette network, a specialized neural network, to dynamically allocate colors from a reduced color space to each pixel. The palette network identifies the areas of synthetic images that matter most for model training and consequently assigns more unique colors to them. At the dataset level, we develop a color-guided initialization strategy to minimize redundancy among images.
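As a rough intuition for color-space reduction (a fixed-palette stand-in, not the learned palette network described above), each pixel can be snapped to its nearest color in a small palette:

```python
def quantize_to_palette(pixels, palette):
    """Map each RGB pixel to its nearest palette color (squared L2 distance).

    A crude, hypothetical stand-in for learned color allocation: the actual
    palette network would predict per-pixel assignments and be trained
    end-to-end, whereas here the palette is fixed in advance.
    """
    def sq_dist(p, q):
        return sum((a - b) ** 2 for a, b in zip(p, q))

    return [min(palette, key=lambda c: sq_dist(px, c)) for px in pixels]
```

With a two-color palette of black and white, dark pixels map to black and bright pixels to white, shrinking the color space the downstream model must represent.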